AAAI AI-Alert for Oct 13, 2020


New deep learning models: Fewer neurons, more intelligence

#artificialintelligence

An international research team from TU Wien (Vienna), IST Austria and MIT (USA) has developed a new artificial intelligence system based on the brains of tiny animals, such as threadworms. This novel AI system can control a vehicle with just a few artificial neurons. The team says the system has decisive advantages over previous deep learning models: it copes much better with noisy input, and, because of its simplicity, its mode of operation can be explained in detail. Rather than being treated as a complex "black box," it can be understood by humans. The new deep learning model has now been published in the journal Nature Machine Intelligence.

  AI-Alerts: 2020 > 2020-10 > AAAI AI-Alert for Oct 13, 2020 (1.00)
  Genre: Research Report (0.50)
  Industry: Transportation (0.53)

Tesla Autopilot Self-Driving Beta Test Will Start Next Week, Elon Musk Confirms

International Business Times

Tesla CEO Elon Musk announced via Twitter on Monday that the company's Autopilot self-driving mode would be made available in a small beta test starting next week. The closed beta, which will be limited to a small pool of "expert and careful drivers," will roll out next week, Car and Driver reported. The Full Self-Driving (FSD) feature has undergone a complete rewrite and is expected to carry a lot of new functionality. The rewrite also updated Autopilot's labeling software to enable it to interpret the environment in 4D instead of 2D. Based on Musk's recent descriptions, the updated software will build on the current "traffic light and stop sign control" feature, will likely add turns at intersections, and will be integrated fully into Autopilot.

  Country: North America > United States (0.18)
  Industry: Transportation > Ground > Road (0.75)

Robot that can perform colonoscopies aims to make the procedure less unpleasant

New Scientist - News

A robot that can perform colonoscopies may make the procedure simpler and less unpleasant. Pietro Valdastri at the University of Leeds in the UK and his colleagues have developed a robotic arm that uses a machine learning algorithm to move a flexible probe along the colon. The probe is a magnetic endoscope, a tube with a camera lens at the tip that the robot controls via a magnet external to the body. The system can either work autonomously or be controlled by a human operator using a joystick, which pushes the endoscope tip further along the colon. The system also keeps track of the location and orientation of the endoscope inside the colon.

  Country: Europe > United Kingdom (0.37)

Earphone cameras watch your facial expressions and read your lips

New Scientist - News

A wearable device consisting of two mini-cameras mounted on earphones can recognise your facial expressions and read your lips, even if your mouth is covered. The tool – called C-Face – was developed by Cheng Zhang at Cornell University in Ithaca, New York, and his colleagues. It looks at the sides of the wearer's head and uses machine learning to accurately visualise facial expressions by analysing small changes in cheek contour lines. "With previous technology to reconstruct facial expression, you had to put a camera in front of you. But that brings a lot of limitations," says Zhang. "Right now, many people are wearing a face mask, and standard facial tracking will not work. Our technology still works because it doesn't rely on what your face looks like."


Amazon's Latest Gimmicks Are Pushing the Limits of Privacy

WIRED

At the end of September, amidst its usual flurry of fall hardware announcements, Amazon debuted two especially futuristic products within five days of each other. The first is a small autonomous surveillance drone, Ring Always Home Cam, that waits patiently inside a charging dock to eventually rise up and fly around your house, checking whether you left the stove on or investigating potential burglaries. The second is a palm recognition scanner, Amazon One, that the company is piloting at two of its grocery stores in Seattle as a mechanism for faster entry and checkout. Both products aim to make security and authentication more convenient--but for privacy-conscious consumers, they also raise red flags. Amazon's latest data-hungry innovations are not launching in a vacuum.


Deep learning enables identification and optimization of RNA-based tools for myriad applications

#artificialintelligence

DNA and RNA have been compared to "instruction manuals" containing the information needed for living "machines" to operate. But while electronic machines like computers and robots are designed from the ground up to serve a specific purpose, biological organisms are governed by a much messier, more complex set of functions that lack the predictability of binary code. Inventing new solutions to biological problems requires teasing apart seemingly intractable variables--a task that is daunting to even the most intrepid human brains. Two teams of scientists from the Wyss Institute at Harvard University and the Massachusetts Institute of Technology have devised pathways around this roadblock by going beyond human brains; they developed a set of machine learning algorithms that can analyze reams of RNA-based "toehold" sequences and predict which ones will be most effective at sensing and responding to a desired target sequence. As reported in two papers published concurrently today in Nature Communications, the algorithms could be generalizable to other problems in synthetic biology as well, and could accelerate the development of biotechnology tools to improve science and medicine and help save lives.


Going Beyond Human Brains: Deep Learning Takes On Synthetic Biology

#artificialintelligence

Work by Wyss Core Faculty member Peng Yin in collaboration with Collins and others has demonstrated that different toehold switches can be combined to compute the presence of multiple "triggers," similar to a computer's logic board.


AI tool could predict how drugs will react in the body - Futurity

#artificialintelligence

A new deep learning-based tool called Metabolic Translator may soon give researchers a better handle on how drugs in development will perform in the human body. When you take a medication, you want to know precisely what it does, and pharmaceutical companies go through extensive testing to ensure that you do. Metabolic Translator, a computational tool that predicts metabolites, the products of interactions between small molecules like drugs and enzymes, could help improve the process. The new tool takes advantage of deep-learning methods and the availability of massive reaction datasets to give developers a broad picture of what a drug will do.


Live facial recognition is tracking kids suspected of being criminals

MIT Technology Review

A new investigation from Human Rights Watch has found that not only are children regularly added to CONARC, Argentina's national database of criminal suspects, but the database also powers a live facial recognition system in Buenos Aires deployed by the city government. This makes the system likely the first known instance of its kind being used to hunt down kids suspected of criminal activity. "It's completely outrageous," says Hye Jung Han, a children's rights advocate at Human Rights Watch, who led the research. Buenos Aires first began trialing live facial recognition on April 24, 2019. Implemented without any public consultation, the system sparked immediate resistance.


How machine learning can help to future-proof clinical trials in the era of COVID-19

AIHub

The COVID-19 pandemic is the greatest global healthcare crisis of our generation, presenting enormous challenges to medical research, including clinical trials. Advances in machine learning are providing an opportunity to adapt clinical trials and lay the groundwork for smarter, faster and more flexible clinical trials in the future. In an article published in Statistics in Biopharmaceutical Research, an international collaboration of data scientists and pharmaceutical industry experts – led by the Director of the Cambridge Centre for AI in Medicine, Professor Mihaela van der Schaar of the University of Cambridge – describes the impact that COVID-19 is having on clinical trials, and reveals how the latest machine learning (ML) approaches can help to overcome challenges that the pandemic presents. The paper covers three areas of clinical trials in which ML can make contributions: trials for repurposing drugs to treat COVID-19, trials for new drugs to treat COVID-19, and ongoing clinical trials for drugs unrelated to COVID-19. The team, which includes scientists from pharmaceutical companies such as Novartis, notes that "the pandemic provides an opportunity to apply novel approaches that can be used in this challenging situation."